Block-proximal methods with spatially adapted acceleration
Author
Abstract
We study and develop (stochastic) primal–dual block-coordinate descent methods based on the method of Chambolle and Pock. Our methods have known convergence rates for the iterates and the ergodic gap: O(1/N²) if each block is strongly convex, O(1/N) if no strong convexity is present, and more generally a mixed rate O(1/N²) + O(1/N) for the strongly convex blocks, if only some blocks are strongly convex. Additional novelties of our methods include blockwise-adapted step lengths and acceleration, as well as the ability to update both the primal and dual variables randomly in blocks under a very light compatibility condition. In other words, these variants of our methods are doubly stochastic. We test the proposed methods on various image processing problems, where we employ pixelwise-adapted acceleration. Get the version from http://tuomov.iki.fi/publications/; citations may be broken in this one due to arXiv's inability to support biblatex.
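To make the blockwise structure concrete, here is a minimal sketch in the spirit of a Chambolle–Pock primal–dual iteration with a separate step length for each primal block. The block partition, proximal operators, and parameter values below are placeholder assumptions for illustration, not the paper's exact algorithm or step-length rules.

```python
# Sketch of a primal-dual (Chambolle-Pock style) iteration with per-block
# primal step lengths, for a saddle-point problem
#   min_x max_y  G(x) + <K x, y> - F*(y)
# Assumed interface: blocks partition the primal indices, prox_G_blocks[j]
# is the prox of the j-th block of G, prox_Fstar is the prox of F*.
import numpy as np

def pdhg_blockwise(K, prox_G_blocks, prox_Fstar, blocks, x0, y0,
                   tau_blocks, sigma, omega=1.0, iters=500):
    x, y = x0.copy(), y0.copy()
    for _ in range(iters):
        x_old = x.copy()
        grad = K.T @ y                          # coupling term K^T y
        for j, idx in enumerate(blocks):        # blockwise primal update
            x[idx] = prox_G_blocks[j](x[idx] - tau_blocks[j] * grad[idx],
                                      tau_blocks[j])
        x_bar = x + omega * (x - x_old)         # over-relaxation / extrapolation
        y = prox_Fstar(y + sigma * (K @ x_bar), sigma)  # dual update
    return x, y
```

In a pixelwise-adapted variant along the lines of the abstract, each entry of tau_blocks could itself be a per-pixel step length, and an accelerated variant would additionally update the step and extrapolation parameters across iterations.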
Similar resources
A Universal Catalyst for First-Order Optimization
We introduce a generic scheme for accelerating first-order optimization methods in the sense of Nesterov, which builds upon a new analysis of the accelerated proximal point algorithm. Our approach consists of minimizing a convex objective by approximately solving a sequence of well-chosen auxiliary problems, leading to faster convergence. This strategy applies to a large class of algorithms, in...
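As a rough sketch of the outer loop described above, the code below approximately minimizes a sequence of proximal-point auxiliary problems and applies a Nesterov-type extrapolation between them. The inner solver, the parameter kappa, and the extrapolation schedule are illustrative assumptions, not the paper's precise Catalyst parameters.

```python
# Catalyst-style outer loop: each step approximately solves the auxiliary
# problem  min_x f(x) + (kappa/2) ||x - y||^2  and then extrapolates y.
import numpy as np

def catalyst_outer(f_grad, inner_solver, x0, kappa=1.0, outer_iters=50):
    # inner_solver(grad_oracle, warm_start) is a user-supplied approximate
    # minimizer of the auxiliary problem (placeholder assumption).
    x_prev, y, alpha = x0.copy(), x0.copy(), 1.0
    for _ in range(outer_iters):
        subgrad = lambda x, y=y: f_grad(x) + kappa * (x - y)
        x = inner_solver(subgrad, y)
        # Nesterov-type schedule: alpha_{k+1}^2 = (1 - alpha_{k+1}) alpha_k^2
        alpha_new = 0.5 * (np.sqrt(alpha**4 + 4 * alpha**2) - alpha**2)
        beta = alpha * (1 - alpha) / (alpha**2 + alpha_new)
        y = x + beta * (x - x_prev)             # extrapolate the prox centre
        x_prev, alpha = x, alpha_new
    return x_prev
```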
Assessment of aortic regurgitation by the acceleration flow signal void proximal to the leaking orifice in cinemagnetic resonance imaging.
BACKGROUND The proximal acceleration flow region is a laminar flow field that is located immediately upstream from the leaking orifice. The purpose of this study was to evaluate whether cinemagnetic resonance imaging can provide information regarding the proximal acceleration flow region in patients with aortic regurgitation and to analyze the relation between the area of the proximal accelerat...
Stochastic Proximal Gradient Descent with Acceleration Techniques
Proximal gradient descent (PGD) and stochastic proximal gradient descent (SPGD) are popular methods for solving regularized risk minimization problems in machine learning and statistics. In this paper, we propose and analyze an accelerated variant of these methods in the mini-batch setting. This method incorporates two acceleration techniques: one is Nesterov’s acceleration method, and the othe...
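A minimal illustration of the combination described above: a mini-batch stochastic gradient step, a proximal step on the regularizer, and Nesterov-style extrapolation. The sampling scheme, fixed step size, and momentum schedule are assumptions for the sketch, not the paper's exact method.

```python
# Accelerated stochastic proximal gradient sketch (mini-batch setting).
# grad_batch(z, batch) returns a stochastic gradient of the smooth loss at z;
# prox_reg(v, step) is the prox of the regularizer (e.g. soft-thresholding).
import numpy as np

def accelerated_spgd(grad_batch, prox_reg, x0, n_samples, step,
                     batch_size=32, iters=1000, rng=None):
    rng = rng or np.random.default_rng(0)
    x, z, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        batch = rng.choice(n_samples, size=batch_size, replace=False)
        g = grad_batch(z, batch)                   # mini-batch gradient at z
        x_new = prox_reg(z - step * g, step)       # proximal step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov extrapolation
        x, t = x_new, t_new
    return x
```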
Modified Linear Approximation for Assessment of Rigid Block Dynamics
This study proposes a new linear approximation for solving the dynamic response equations of a rocking rigid block. Linearization assumptions which have already been used by Housner and other researchers cannot be valid for all rocking blocks with various slenderness ratios and dimensions; hence, developing new methods which can result in better approximation of governing equations while keepin...
Multiframe image super-resolution adapted with local spatial information.
Super-resolution image reconstruction, which has been a hot research topic in recent years, is a process to reconstruct high-resolution images from shifted, low-resolution, degraded observations. Among the available reconstruction frameworks, the maximum a posteriori (MAP) model is widely used. However, existing methods usually employ a fixed prior item and regularization parameter for the enti...
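As a rough illustration of replacing a single fixed regularization parameter with a spatially varying one, the sketch below runs gradient descent on a MAP-style super-resolution objective with a per-pixel weight map. The observation operator, weight map, and step size are hypothetical placeholders, not the paper's model.

```python
# Gradient descent on a MAP-style objective with spatially adapted weights:
#   min_x 0.5 * ||A x - y||^2 + 0.5 * sum_i w_i ||(grad x)_i||^2
# where w is a per-pixel regularization weight map (placeholder assumption).
import numpy as np

def grad2d(x):
    gx = np.diff(x, axis=1, append=x[:, -1:])   # forward differences
    gy = np.diff(x, axis=0, append=x[-1:, :])
    return gx, gy

def div2d(gx, gy):
    # backward-difference divergence, adjoint of grad2d up to boundary handling
    dx = np.diff(gx, axis=1, prepend=gx[:, :1])
    dy = np.diff(gy, axis=0, prepend=gy[:1, :])
    return dx + dy

def map_superres(A, At, y, w, shape, step=0.1, iters=200):
    # A, At: forward observation operator (blur + downsample) and its adjoint,
    # supplied by the caller as functions on flattened images (assumptions).
    x = At(y).reshape(shape)                     # crude initialization
    for _ in range(iters):
        data_grad = At(A(x.ravel()) - y).reshape(shape)
        gx, gy = grad2d(x)
        reg_grad = -div2d(w * gx, w * gy)        # gradient of weighted smoothness term
        x -= step * (data_grad + reg_grad)
    return x
```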
Journal title: CoRR
Volume: abs/1609.07373
Pages: -
Publication date: 2016